
    Digitizing Darwin's Library

    This project, which aims to digitally reconstruct Charles Darwin's working library as it stood at the end of his life, will open up and make accessible to students of the humanities and the sciences whole new dimensions of Darwin's thinking. Over 700 of Darwin's most heavily annotated books are held at Cambridge University Library. The abundant handwritten notes in these books were painstakingly transcribed in the late 1980s. Now, thanks to high-resolution digital imagery and an international partnership of Cambridge, the Natural History Museum in London, the Biodiversity Heritage Library (a consortium of natural history libraries), and the Darwin Digital Library of Evolution (an online scholarly edition of Darwin's manuscripts based at the American Museum of Natural History), Darwin's transcribed marginalia will be digitally married with scanned books from his own library and with scanned surrogate volumes, held by the partnership's libraries, of the exact editions Darwin owned.

    On Scalable Particle Markov Chain Monte Carlo

    Particle Markov Chain Monte Carlo (PMCMC) is a general approach to carrying out Bayesian inference in non-linear and non-Gaussian state space models. Our article shows how to scale up PMCMC in terms of the number of observations and parameters by expressing the target density of the PMCMC algorithm in terms of the basic uniform or standard normal random numbers used in the sequential Monte Carlo algorithm, rather than in terms of the particles. Parameters that can be drawn efficiently conditional on the particles are generated by particle Gibbs. All the other parameters, for example parameters that are highly correlated with the states or parameters whose generation is expensive when conditioning on the states, are drawn by conditioning on the basic uniform or standard normal random variables. We investigate the performance of this hybrid sampler empirically by applying it to univariate and multivariate stochastic volatility models with both a large number of parameters and a large number of latent states, and show that it is much more efficient than competing PMCMC methods. We also show that the proposed hybrid sampler is ergodic.
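
    To make the reparameterization concrete, here is a minimal sketch, not the authors' code: a bootstrap particle filter for a toy AR(1) state space model, written so the log-likelihood estimate is a deterministic function of the parameters and a fixed block of standard normal and uniform draws, followed by a Metropolis-Hastings update of the parameters that conditions on those basic random numbers. The model, proposal, and tuning constants are illustrative assumptions, and the step that refreshes the random-number block (needed for a valid sampler) is omitted.

        import numpy as np

        def particle_loglik(theta, y, u_norm, u_unif):
            """Log-likelihood estimate as a deterministic function of theta and
            the basic random numbers (u_norm, u_unif) driving the filter."""
            phi, sigma = theta                       # AR(1) state; y_t = x_t + N(0, sigma^2)
            T, N = u_norm.shape
            x = u_norm[0] / np.sqrt(1.0 - phi**2)    # stationary initial particles
            ll = 0.0
            for t in range(T):
                if t > 0:
                    x = phi * x + u_norm[t]          # propagate with the fixed normals
                logw = -0.5 * ((y[t] - x) / sigma)**2 - np.log(sigma)
                m = logw.max()
                w = np.exp(logw - m)
                ll += m + np.log(w.mean())
                # systematic resampling driven by the fixed uniform u_unif[t]
                cdf = np.cumsum(w / w.sum())
                pos = (u_unif[t] + np.arange(N)) / N
                x = x[np.minimum(np.searchsorted(cdf, pos), N - 1)]
            return ll

        # Metropolis-Hastings on theta = (phi, sigma), conditioning on the basic
        # random numbers: u_norm and u_unif are held fixed, so the estimated
        # likelihood is a smooth deterministic function of theta. A full sampler
        # would also refresh the u block (e.g. by particle Gibbs); omitted here.
        rng = np.random.default_rng(0)
        T, N = 200, 100
        y = rng.normal(size=T)                       # placeholder data
        u_norm, u_unif = rng.normal(size=(T, N)), rng.uniform(size=T)
        theta = np.array([0.9, 1.0])
        ll = particle_loglik(theta, y, u_norm, u_unif)
        for _ in range(200):
            prop = theta + 0.05 * rng.normal(size=2)
            if abs(prop[0]) >= 1.0 or prop[1] <= 0.0:
                continue                             # stay in the stationary region
            ll_prop = particle_loglik(prop, y, u_norm, u_unif)
            if np.log(rng.uniform()) < ll_prop - ll: # flat prior for simplicity
                theta, ll = prop, ll_prop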

    Mixed Marginal Copula Modeling

    This article extends the literature on copulas with discrete or continuous marginals to the case where some of the marginals are a mixture of discrete and continuous components. We do so by carefully defining the likelihood as the density of the observations with respect to a mixed measure. The treatment is quite general, although we focus on mixtures of Gaussian and Archimedean copulas. The inference is Bayesian, with the estimation carried out by Markov chain Monte Carlo. We illustrate the methodology and algorithms by applying them to estimate a multivariate income dynamics model.
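
    As a minimal illustration of a likelihood defined with respect to a mixed measure, the sketch below evaluates one bivariate observation under a Gaussian copula in which the first marginal mixes a point mass at zero with a continuous lognormal part (as with income data that can be exactly zero) and the second marginal is continuous. The distributional choices and parameter names are assumptions for the example, not the paper's model.

        import numpy as np
        from scipy.stats import norm, lognorm

        def loglik_obs(y1, y2, rho, p0, s1, s2):
            """Log density of one pair (y1, y2) with respect to a mixed measure:
            counting measure on the atom {y1 = 0}, Lebesgue measure elsewhere.
            Marginal 1: point mass p0 at 0, lognormal(s1) above 0 (hypothetical).
            Marginal 2: N(0, s2^2). Dependence: Gaussian copula, correlation rho."""
            z2 = y2 / s2                              # Phi^{-1}(F2(y2)) for N(0, s2^2)
            base = norm.logpdf(y2, scale=s2)          # continuous marginal 2 density
            if y1 == 0.0:
                # atom: contribute a conditional probability, not a density
                b = norm.ppf(p0)                      # latent threshold for {Y1 = 0}
                return base + norm.logcdf((b - rho * z2) / np.sqrt(1.0 - rho**2))
            # continuous part: Gaussian copula density times both marginal densities
            u1 = p0 + (1.0 - p0) * lognorm.cdf(y1, s1)
            z1 = norm.ppf(u1)
            log_c = (-0.5 * np.log(1.0 - rho**2)
                     - (rho**2 * (z1**2 + z2**2) - 2.0 * rho * z1 * z2)
                     / (2.0 * (1.0 - rho**2)))
            return base + log_c + np.log(1.0 - p0) + lognorm.logpdf(y1, s1)

        # e.g. an observation sitting on the atom versus one off it:
        print(loglik_obs(0.0, 0.5, rho=0.4, p0=0.3, s1=1.0, s2=1.0))
        print(loglik_obs(1.2, 0.5, rho=0.4, p0=0.3, s1=1.0, s2=1.0))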

    Variational Bayes with Intractable Likelihood

    Variational Bayes (VB) is rapidly becoming a popular tool for Bayesian inference in statistical modeling. However, existing VB algorithms are restricted to cases where the likelihood is tractable, which precludes their use in many interesting situations such as state space models and approximate Bayesian computation (ABC). This paper extends the scope of application of VB to cases where the likelihood is intractable but can be estimated unbiasedly. The proposed VB method therefore makes it possible to carry out Bayesian inference in many statistical applications, including state space models and ABC. The method is generic in the sense that it can be applied to almost all statistical models without requiring much model-based derivation, which is a drawback of many existing VB algorithms. We also show how the proposed method can be used to obtain highly accurate VB approximations of marginal posterior distributions.
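
    The following sketch shows the basic mechanics under stated assumptions and is not the authors' algorithm: a score-function stochastic-gradient VB update for a Gaussian variational approximation, with the intractable log-likelihood replaced by the log of an unbiased importance-sampling estimate. Plugging in the log of an unbiased estimate introduces a bias that the paper handles by working on an expanded space; that refinement, along with multiple samples per step, is omitted here, and only a crude baseline is used for variance reduction.

        import numpy as np

        rng = np.random.default_rng(1)
        y = rng.normal(1.0, np.sqrt(2.0), size=50)    # toy data; true theta = 1

        def loghat_lik(theta, n_z=200):
            """Log of an unbiased importance-sampling estimate of p(y | theta),
            where p(y_i | theta) = E_z[ N(y_i; theta + z, 1) ], z ~ N(0, 1)."""
            z = rng.normal(size=(y.size, n_z))        # independent draws per observation
            dens = np.exp(-0.5 * (y[:, None] - theta - z)**2) / np.sqrt(2.0 * np.pi)
            return np.sum(np.log(dens.mean(axis=1)))

        # q(theta) = N(mu, e^{2 log_s}); score-function (REINFORCE) gradient of
        # the ELBO with the estimated log-likelihood plugged in. One sample per
        # step, so the updates are noisy; this is only meant to show the idea.
        mu, log_s, baseline, lr = 0.0, 0.0, 0.0, 1e-3
        for it in range(5000):
            s = np.exp(log_s)
            theta = mu + s * rng.normal()
            h = (-0.5 * theta**2                      # N(0, 1) log prior, up to a constant
                 + loghat_lik(theta)
                 + log_s + 0.5 * ((theta - mu) / s)**2)   # -log q, up to a constant
            mu += lr * ((theta - mu) / s**2) * (h - baseline)
            log_s += lr * (((theta - mu) / s)**2 - 1.0) * (h - baseline)
            baseline = 0.9 * baseline + 0.1 * h       # simple unbiased baseline
        print(mu, np.exp(log_s))                      # approximate posterior mean, sd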

    Bayesian Deep Net GLM and GLMM

    Deep feedforward neural networks (DFNNs) are a powerful tool for functional approximation. We describe flexible versions of generalized linear and generalized linear mixed models that incorporate basis functions formed by a DFNN. Neural networks with random effects have received little attention in the literature, perhaps because of the computational challenge of incorporating subject-specific parameters into already complex models. We develop efficient computational methods for high-dimensional Bayesian inference using Gaussian variational approximation, with a parsimonious but flexible factor parametrization of the covariance matrix. We implement natural gradient methods for the optimization, exploiting the factor structure of the variational covariance matrix in the computation of the natural gradient. Our flexible DFNN models and Bayesian inference approach lead to a regression and classification method that has high prediction accuracy and is able to quantify prediction uncertainty in a principled and convenient way. We also describe how to perform variable selection in our deep learning method. The proposed methods are illustrated in a wide range of simulated and real-data examples, and the results compare favourably to a state-of-the-art flexible regression and classification method in the statistical literature, Bayesian additive regression trees (BART). User-friendly software packages in Matlab, R and Python implementing the proposed methods are available at https://github.com/VBayesLab
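
    Here is a minimal sketch of the factor-parametrized Gaussian variational approximation, under illustrative assumptions rather than the authors' implementation: the variational posterior is N(mu, BB' + D^2) with a p x k factor matrix B and diagonal D, optimized by plain reparameterized stochastic gradient ascent (a fixed step size stands in for the paper's natural-gradient updates), and a toy Bayesian logistic regression stands in for the DFNN-based GLM.

        import numpy as np

        rng = np.random.default_rng(2)
        n, p, k = 200, 5, 2                          # observations, parameters, factors
        X = rng.normal(size=(n, p))
        beta_true = rng.normal(size=p)
        y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta_true)))

        def grad_logpost(theta):
            """Gradient of the N(0, I) log prior plus the logistic log-likelihood."""
            prob = 1.0 / (1.0 + np.exp(-X @ theta))
            return -theta + X.T @ (y - prob)

        def sigma_inv(B, d):
            """Woodbury inverse of B B^T + diag(d)^2 (the factor covariance)."""
            Dinv2 = 1.0 / d**2
            M = np.linalg.inv(np.eye(k) + (B.T * Dinv2) @ B)
            return np.diag(Dinv2) - (Dinv2[:, None] * B) @ M @ (B.T * Dinv2)

        # Variational posterior N(mu, B B^T + D^2); reparameterized draw
        # theta = mu + B eps + d * zeta. ELBO gradient = path gradient of the
        # log posterior plus the analytic entropy gradient (Sigma^{-1} B for B,
        # diag(Sigma^{-1}) * d for d); entropy does not depend on mu.
        mu = np.zeros(p)
        B = 0.01 * rng.normal(size=(p, k))
        d = np.full(p, 0.1)
        lr = 1e-3
        for it in range(5000):
            eps, zeta = rng.normal(size=k), rng.normal(size=p)
            theta = mu + B @ eps + d * zeta
            g = grad_logpost(theta)
            Sinv = sigma_inv(B, d)
            mu += lr * g
            B += lr * (np.outer(g, eps) + Sinv @ B)
            d += lr * (g * zeta + np.diag(Sinv) * d)
            d = np.maximum(d, 1e-3)                  # keep the diagonal positive
        print(mu)                                     # approximate posterior mean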